Semantic relation extraction model via attention based neural Turing machine
ZHANG Runyan, MENG Fanrong, ZHOU Yong, LIU Bing
Journal of Computer Applications    2018, 38 (7): 1831-1838.   DOI: 10.11772/j.issn.1001-9081.2017123009
Abstract
Focusing on the problems of poor memory over long sentences and the insufficient influence of core words in semantic relation extraction, an Attention-based bidirectional Neural Turing Machine (Ab-NTM) model was proposed. First, instead of a Recurrent Neural Network (RNN), a Neural Turing Machine (NTM) with a Long Short-Term Memory (LSTM) network acting as its controller was used; its larger, interference-free external storage allows it to hold longer memories than an RNN. Second, an attention layer was used to aggregate context information at the word level, so that the model could focus on the core words in a sentence. Finally, the relation labels were obtained through a classifier. Experiments on the SemEval-2010 Task 8 dataset show that the proposed model outperforms most state-of-the-art methods, achieving an F1-score of 86.2%.
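The following is a minimal sketch of the word-level attention and classification stages described in the abstract, not the authors' released implementation. The paper's encoder is a bidirectional NTM with an LSTM controller; a plain bidirectional LSTM is substituted here as a stand-in, and all hyperparameter names (embed_dim, hidden_dim, num_relations) are illustrative. The default of 19 classes corresponds to the SemEval-2010 Task 8 label set (9 directed relations plus Other).

```python
import torch
import torch.nn as nn

class AttentionRelationClassifier(nn.Module):
    """Word-level attention over encoder states, followed by a relation classifier.

    Simplified sketch: the paper uses a bidirectional Neural Turing Machine with
    an LSTM controller as the encoder; a plain bidirectional LSTM stands in here.
    """
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_relations=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               bidirectional=True, batch_first=True)
        # Learned query vector used to score each word's relevance.
        self.attn_vector = nn.Parameter(torch.randn(2 * hidden_dim))
        self.classifier = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        h, _ = self.encoder(self.embed(token_ids))           # (batch, seq_len, 2*hidden)
        scores = torch.tanh(h) @ self.attn_vector            # (batch, seq_len)
        alpha = torch.softmax(scores, dim=1).unsqueeze(-1)   # attention weights
        sentence = (alpha * h).sum(dim=1)                    # weighted sentence vector
        return self.classifier(sentence)                     # relation logits

# Usage: classify a toy batch of two 6-token sentences.
model = AttentionRelationClassifier(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 6)))
print(logits.shape)  # torch.Size([2, 19])
```

The attention weights alpha are what let the model emphasize core words in each sentence before classification; replacing the BiLSTM with an NTM-style external memory is the paper's contribution for retaining information over long sentences.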